22 research outputs found

    How hand movements and speech tip the balance in cognitive development: A story about children, complexity, coordination, and affordances

    When someone asks us to explain something, such as how a lever or balance scale works, we spontaneously move our hands and gesture. This is also true for children. Furthermore, children use their hands to discover things and to find out how something works. Previous research has shown that children’s hand movements are ahead of their speech in this respect, and that they play a leading role in cognitive development. Explanations for this have assumed that cognitive understanding takes place in one’s head, and that hand movements and speech (only) reflect it. However, cognitive understanding arises from, and consists of, the constant interplay between (hand) movements, speech, and someone’s physical and social environment. The physical environment includes, for example, task properties; the social environment includes other people. I therefore focused on this constant interplay between hand movements, speech, and the environment, to better understand the role of hand movements in cognitive development. Using science and technology tasks, we found that children’s speech affects their hand movements more than the other way around. During difficult tasks, the coupling between hand movements and speech becomes even stronger than in easy tasks. Interim changes in task properties affect hand movements and speech differently. Collaborating children coordinate their hand movements, their speech, and even their head movements with each other. The coupling between hand movements and speech is related to age and (school) performance. It is important that teachers attend to children’s hand movements and speech, and arrange their lessons and classrooms such that there is room for both.

    Movers and shakers of cognition: Hand movements, speech, task properties, and variability

    Children move their hands to explore, learn, and communicate about hands-on tasks. Their hand movements seem to be “learning” ahead of their speech. Children shape their hand movements in accordance with spatial and temporal task properties, such as when they feel an object or simulate its movements. Their speech, however, does not directly correspond to these spatial and temporal task properties. We aimed to understand whether and how hand movements lead cognitive development through their ability to correspond to spatiotemporal task properties, while speech is unable to do so. We explored whether the variability of hand movements and speech changed with a change in spatiotemporal task properties, using two variability measures: Diversity indicates adaptation, while Complexity indicates the flexibility to adapt. In two experiments, we asked children (4–7 years) to predict and explain balance scale problems, whereby we manipulated either the length of the balance scale or the mass of the weights after half of the trials. In three out of four conditions, we found a change in Complexity for both hand movements and speech between the first and second half of the task. In one of these conditions, we found a relation between the differences in Complexity and Diversity of hand movements and speech. Changes in spatiotemporal task properties thus often influenced the flexibility of both hand movements and speech, but there seem to be differences in how they did so. We provide many directions for future research, to further unravel the relations between hand movements, speech, task properties, variability, and cognitive development.

    Asymmetric coupling between gestures and speech during reasoning

    When children learn, insights displayed in gestures typically precede insights displayed in speech. In this study, we investigated how this leading role of gestures in cognitive development is evident in (and emerges from) the dynamic coupling between gestures and speech during one task. We investigated 12 children (Mage = 5.4 years) from Kindergarten and first grade who performed an air pressure task. Children’s gestures and speech were coded from video recordings, and levels of reasoning, based on Skill Theory, were assigned. To analyze the dynamic coupling between gestures and speech, Cross-Recurrence Quantification Analysis was performed on the two coupled time series. We found gestures to be ahead of speech for children in Kindergarten, but speech and gestures were more temporally aligned for first graders. Furthermore, we found speech to affect gestures more than vice versa for all children, but the degree of this asymmetry of bidirectional regulation differed. In Kindergarten, a higher score on language tests was related to more asymmetry between gestures and speech, while for first graders this relation was present for higher, within-task, levels of understanding. A more balanced, i.e., less asymmetric, coupling between gestures and speech was, however, related to a higher score on math and past science tasks. Our findings suggest that the relation between gestures, speech, and cognitive development is more subtle than previously thought. Specifically, the nature of the coupling between gestures and speech not only expresses but might also predict learning differences between children, both within and across learning domains. We hope our study will foster future research on learning as a dynamic, embodied, and embedded process.
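The cross-recurrence idea used in this line of work can be sketched in a few lines: given two coded time series (here, skill levels of gestures and of speech), a cross-recurrence matrix marks every pair of time points at which the two codes match, and the balance of recurrent points above versus below the line of synchrony gives a simple leader-follower diagnostic. A minimal NumPy sketch with hypothetical coded series — the function names, the toy data, and this particular asymmetry measure are illustrative, not the authors' exact pipeline:

```python
import numpy as np

def cross_recurrence(x, y, radius=0):
    """Cross-recurrence matrix: cell (i, j) is 1 when the two coded
    values x[i] and y[j] are within `radius` of each other."""
    x, y = np.asarray(x), np.asarray(y)
    return (np.abs(x[:, None] - y[None, :]) <= radius).astype(int)

def crqa_measures(cr):
    """Recurrence rate plus a simple leader-follower asymmetry:
    recurrence mass above vs. below the line of synchrony.
    (Which triangle counts as 'x leads' is a convention choice.)"""
    rr = cr.mean()
    upper = np.triu(cr, k=1).sum()   # one series leading in time
    lower = np.tril(cr, k=-1).sum()  # the other series leading
    asym = (upper - lower) / max(upper + lower, 1)
    return rr, asym

# hypothetical skill-level codings for gestures and speech over 8 utterances
gestures = [1, 1, 2, 2, 3, 3, 3, 4]
speech   = [1, 1, 1, 2, 2, 3, 3, 3]
cr = cross_recurrence(gestures, speech)
rr, asym = crqa_measures(cr)
```

In this toy series the gesture codes run ahead of the speech codes, so most recurrent points fall on one side of the diagonal and `asym` is far from zero; temporally aligned series would give a value near zero.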

    I like to move it, move it: Package for social scientists who want to use OpenPose

    Package containing an “OpenPose for dummies” manual, an R script to transform OpenPose output into a correctly ordered csv file, a poster presentation, and a demo video that can be used to try out OpenPose.
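The reordering step that the package's R script performs can be illustrated with a short Python analogue. It assumes the standard OpenPose output layout — one JSON file per frame, a "people" list, and a flat "pose_keypoints_2d" array of (x, y, confidence) triples; the function name and CSV column layout are illustrative, not the package's exact output:

```python
import csv
import json
from pathlib import Path

def openpose_to_csv(json_dir, csv_path):
    """Flatten a directory of per-frame OpenPose JSON files into one
    frame-ordered CSV: frame, person, keypoint, x, y, confidence."""
    files = sorted(Path(json_dir).glob("*.json"))  # filenames encode frame order
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["frame", "person", "keypoint", "x", "y", "confidence"])
        for frame, path in enumerate(files):
            data = json.loads(path.read_text())
            for person, entry in enumerate(data.get("people", [])):
                kp = entry["pose_keypoints_2d"]  # flat (x, y, c) triples
                for k in range(len(kp) // 3):
                    writer.writerow(
                        [frame, person, k, kp[3 * k], kp[3 * k + 1], kp[3 * k + 2]]
                    )
```

Sorting the filenames before iterating is the crucial step: OpenPose numbers its per-frame files, and downstream time-series analyses silently break if the rows are not in frame order.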

    Asymmetric Dynamic Attunement of Speech and Gestures in the Construction of Children's Understanding

    As children learn, they use their speech to express words and their hands to gesture. This study investigates the interplay between real-time gestures and speech as children construct cognitive understanding during a hands-on science task. Twelve children (M = 6, F = 6) from Kindergarten (n = 5) and first grade (n = 7) participated in this study. Each verbal utterance and gesture during the task was coded on a complexity scale derived from dynamic skill theory. To explore the interplay between speech and gestures, we applied a cross-recurrence quantification analysis (CRQA) to the two coupled time series of the skill levels of verbalizations and gestures. The analysis focused on (1) the temporal relation between gestures and speech, (2) the relative strength and direction of the interaction between gestures and speech, (3) the relative strength and direction between gestures and speech for different levels of understanding, and (4) relations between CRQA measures and other child characteristics. The results show that older and younger children differ in the (temporal) asymmetry of the gestures-speech interaction. For younger children, the balance leans more toward gestures leading speech in time, while for older children it leans more toward speech leading gestures. Secondly, at the group level, speech attracts gestures in a more dynamically stable fashion than vice versa, and this asymmetry between gestures and speech extends to lower and higher understanding levels. Yet, for older children, the mutual coupling between gestures and speech is more dynamically stable at the higher understanding levels. Gestures and speech are more synchronized in time as children get older. A higher score on the schools' language tests is related to speech attracting gestures more rigidly and to more asymmetry between gestures and speech, only for the less difficult understanding levels. A higher score on math or past science tasks is related to less asymmetry between gestures and speech. The picture that emerges from our analyses suggests that the relation between gestures, speech, and cognition is more complex than previously thought. We suggest that temporal differences and asymmetry in influence between gestures and speech arise from the simultaneous coordination of synergies.

    Can you feel my rhythm? Interpersonal coordination between a child with deafblindness and their mentor

    How can you develop language if you are born without proper sight and hearing? Children with congenital deafblindness face profound difficulties in acquiring language, largely due to a lack of access to language in their environment. Indeed, only a few people with congenital deafblindness acquire language beyond the level of naming a limited number of things, using tactile signs. Language is extremely powerful in extending the boundaries of communication, and not acquiring language proficiency is detrimental to someone’s developmental opportunities throughout life. To enable more children with congenital deafblindness to become more proficient in language, we need to better understand their individual paths to language. Language does not appear and develop all of a sudden, regardless of the presence or absence of any sensory impairments. Instead, language emerges from and builds upon processes underlying interpersonal interaction, such as imitation, attunement, synchronization, and coordination. A detailed understanding of interpersonal interaction between children with deafblindness and significant others promises to yield vital understanding of opportunities to learn language. In this case study, we investigated in detail the interpersonal interaction between a child with congenital deafblindness and their mentor. For this study, we analyzed interpersonal interaction using a video that was previously recorded for an effect study (Huiskens, 2015). We manually coded the harmonicity of the interaction, using an earlier developed coding system (Janssen et al., 2003). In addition, we tracked the hand movements of the child and the mentor in the video. We confined the coding and motion tracking to episodes in the video in which the head and at least one of the hands of both the child and the mentor were visible. The child sat on the lap of the mentor, and neither changed their posture significantly (e.g., going from sitting to standing). This resulted in the further analysis of two episodes (~130–150 s each): one episode with a predominantly harmonious interaction, and one episode with a predominantly disharmonious interaction. For both episodes, we investigated the attunement of velocity and acceleration of the child’s and mentor’s hand movements. First, we visualized and described the velocity and acceleration of hand movements over time, across the episodes. Second, we applied Cross-Recurrence Quantification Analysis (CRQA) to the time series of the velocity and acceleration of the child’s and mentor’s hand movements. CRQA quantifies the stability, strength, and dynamics of coordination between two coupled dynamical systems, in this case the child and their mentor. We found that the movement profiles in the disharmonious interaction were more capricious than in the harmonious interaction. Furthermore, we found a more evenly balanced leader-follower pattern and more attunement in the harmonious interaction, compared to the disharmonious interaction. Our results thus show that the way in which the child with congenital deafblindness and their mentor move their hands together from moment to moment is closely related to the global nature of their interpersonal interaction. Our study is the first to apply a combination of detailed motion tracking and coordination dynamics analysis with more qualitative methods to investigate the interpersonal interactions between children with deafblindness and significant others. Using these methods, we found that the harmonicity of the interaction is evident from the coupling and attunement between the child and their environment over time. Similarly, opportunities for language development arise from coordination between the child and their environment – and in fact are intertwined with harmonicity itself. We therefore believe that more studies on coordination between children and their environment will lead to a better understanding of the many paths leading to language for children with deafblindness.
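The motion-tracking step described above ultimately yields per-frame hand positions, and the velocity and acceleration time series fed into the CRQA can be derived from those positions by finite differences. A minimal sketch, assuming tracked 2-D positions sampled at a known frame rate (the function name is hypothetical, and real pipelines typically smooth the positions first):

```python
import numpy as np

def kinematics(positions, fps):
    """Speed and acceleration profiles from tracked 2-D hand positions
    (one (x, y) row per video frame), via finite differences."""
    positions = np.asarray(positions, dtype=float)
    dt = 1.0 / fps
    vel = np.gradient(positions, dt, axis=0)  # per-axis velocity
    speed = np.linalg.norm(vel, axis=1)       # scalar speed time series
    acc = np.gradient(speed, dt)              # change in speed over time
    return speed, acc
```

Running the child's and the mentor's position tracks through such a function gives two coupled speed (or acceleration) time series of equal length, which is exactly the input shape a cross-recurrence analysis expects.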